Efficient Private Empirical Risk Minimization for High-dimensional Learning

Authors

  • Shiva Prasad Kasiviswanathan
  • Hongxia Jin
Abstract

Dimensionality reduction is a popular approach for dealing with high-dimensional data that leads to substantial computational savings. Random projections are a simple and effective method for universal dimensionality reduction with rigorous theoretical guarantees. In this paper, we theoretically study the problem of differentially private empirical risk minimization in the projected subspace (compressed domain). We ask: is it possible to design differentially private algorithms with small excess risk given access to only the projected data? We answer this question in the affirmative, by showing that for the class of generalized linear functions, given only the projected data and the projection matrix, we can obtain excess risk bounds of Õ(w(C)/n) under ε-differential privacy, and Õ(√w(C)/n) under (ε, δ)-differential privacy, where n is the sample size and w(C) is the Gaussian width of the parameter space C that we optimize over. A simple consequence of these results is that, for a large class of ERM problems, in the traditional setting (i.e., with access to the original data), under ε-differential privacy, we improve upon the worst-case risk bounds of Bassily et al. (2014).
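For context, the Gaussian width of a set C ⊆ R^d is w(C) = E_g[sup_{θ∈C} ⟨g, θ⟩] for g ∼ N(0, I_d); it is the quantity that replaces the ambient dimension d in the bounds above. As a minimal sketch of the compressed-domain setup only (not the authors' mechanism), the toy Python below pairs a Gaussian random projection with a naive noisy-gradient ERM loop on logistic loss; the clipping, Laplace noise scale, and per-step epsilon/steps composition are simplifying assumptions made for brevity.

```python
# Toy compressed-domain DP-ERM sketch (illustrative only; not the
# paper's algorithm -- the privacy accounting here is deliberately naive).
import numpy as np

def gaussian_projection(X, m, rng):
    """Project n x d data X to n x m via a random Gaussian matrix Phi."""
    d = X.shape[1]
    Phi = rng.normal(0.0, 1.0 / np.sqrt(m), size=(d, m))  # JL-style scaling
    return X @ Phi, Phi

def noisy_gradient_erm(Z, y, epsilon, steps=100, lr=0.1, rng=None):
    """Logistic-loss ERM on projected data Z with per-step Laplace noise.

    Rows of Z are clipped to unit norm, so replacing one example moves the
    mean gradient by at most 2/n in L2 norm; the Laplace scale and the
    naive epsilon/steps split below are assumptions, not a tight analysis.
    """
    if rng is None:
        rng = np.random.default_rng(0)
    n, m = Z.shape
    Z = Z / np.maximum(np.linalg.norm(Z, axis=1, keepdims=True), 1.0)
    w = np.zeros(m)
    eps_step = epsilon / steps
    for _ in range(steps):
        margins = y * (Z @ w)                    # shape (n,)
        coef = -y / (1.0 + np.exp(margins))      # per-example gradient weight
        grad = (Z * coef[:, None]).mean(axis=0)  # mean gradient, shape (m,)
        noise = rng.laplace(0.0, 2.0 / (n * eps_step), size=m)
        w -= lr * (grad + noise)
    return w

rng = np.random.default_rng(42)
X = rng.normal(size=(500, 1000))                   # n = 500, d = 1000
y = np.sign(X[:, 0] + 0.1 * rng.normal(size=500))  # labels in {-1, +1}
Z, Phi = gaussian_projection(X, m=50, rng=rng)     # learner sees only Z, Phi
w_priv = noisy_gradient_erm(Z, y, epsilon=1.0, rng=rng)
```

The only point of the sketch is the data flow: the learner touches Z = XΦ and Φ, never the raw X, which is exactly the access model the question above asks about.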

Related articles

Learning in a Large Function Space: Privacy-Preserving Mechanisms for SVM Learning

Several recent studies in privacy-preserving learning have considered the trade-off between utility or risk and the level of differential privacy guaranteed by mechanisms for statistical query processing. In this paper we study this trade-off in private Support Vector Machine (SVM) learning. We present two efficient mechanisms, one for the case of finite-dimensional feature mappings and one for potentia...
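For the finite-dimensional case, the release pattern can be pictured as output perturbation: fit the SVM, then publish the weight vector plus calibrated Laplace noise. The toy sketch below follows that pattern under stated assumptions; scikit-learn's LinearSVC is used as a stand-in base learner, and `noise_scale` is a hypothetical placeholder for the sensitivity-derived calibration that a real mechanism would compute from the regularization and data norms.

```python
# Hedged output-perturbation sketch for a linear SVM (not the paper's
# calibrated mechanism; noise_scale is an uncalibrated placeholder).
import numpy as np
from sklearn.svm import LinearSVC

def noisy_linear_svm(X, y, epsilon, noise_scale, seed=0):
    """Fit a linear SVM and release Laplace-perturbed primal weights."""
    clf = LinearSVC(C=1.0).fit(X, y)
    rng = np.random.default_rng(seed)
    w = clf.coef_.ravel()
    w_noisy = w + rng.laplace(0.0, noise_scale / epsilon, size=w.size)
    return w_noisy, clf.intercept_[0]
```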

Differentially Private Empirical Risk Minimization Revisited: Faster and More General

In this paper we study the differentially private Empirical Risk Minimization (ERM) problem in different settings. For smooth (strongly) convex loss functions with or without (non-)smooth regularization, we give algorithms that achieve either optimal or near-optimal utility bounds with lower gradient complexity than previous work. For ERM with smooth convex loss functions in high-dimensio...

Private Convex Optimization for Empirical Risk Minimization with Applications to High-dimensional Regression

We consider differentially private algorithms for convex empirical risk minimization (ERM). Differential privacy (Dwork et al., 2006b) is a recently introduced notion of privacy which guarantees that an algorithm’s output does not depend on the data of any individual in the dataset. This is crucial in fields that handle sensitive data, such as genomics, collaborative filtering, and economics. O...

Efficient Empirical Risk Minimization with Smooth Loss Functions in Non-interactive Local Differential Privacy

In this paper, we study the Empirical Risk Minimization problem in the non-interactive local model of differential privacy. We first show that if the ERM loss function is (∞, T)-smooth, then we can avoid a dependence of the sample complexity, to achieve error α, on the exponential of the dimensionality p with base 1/α (i.e., α^{-p}), which answers a question in (Smith et al., 2017). Our approach ...


Publication date: 2016